An Interactive Microarray Call Graph

Authors

  • Benjamin Wolfe
  • Michael Shah
  • Ryan O'Connell
Abstract

In order of presentation.

Session A – 15 minute talks

An Interactive Microarray Call Graph Visualization
Michael Shah | Computer Science Engineering | Ph.D. Student

In this paper we present an interactive call graph visualization tool for viewing large programs. Our space-filling, grid-based visualization shows a program at the function granularity. The grid view provides an overview of the entire system, allows the user to investigate and view subsets of functions, and supports jumping to source code for more detail on demand. In our benchmarks we were able to view and explore code relationships in programs with 18,720 functions at interactive frame rates. We provide several code exploration use cases useful for program comprehension, including profile-guided compiler optimizations, Java synchronization, and extracting smaller subgraphs of a software system. Our software visualization tool is Java based and portable across multiple platforms.
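
The tool itself is Java based, but the underlying structure is easy to sketch: a call graph is a directed graph over functions, and extracting a smaller subgraph means carving out the neighborhood of a function of interest. A minimal illustration in Python with networkx, using invented function names rather than anything from the paper:

    import networkx as nx

    # A toy call graph: each edge means "caller invokes callee".
    calls = [
        ("main", "parse_args"), ("main", "run"),
        ("run", "load_data"), ("run", "render_grid"),
        ("render_grid", "draw_cell"), ("load_data", "read_file"),
    ]
    g = nx.DiGraph(calls)

    # Carve out everything reachable from "run": one way to extract a
    # smaller subgraph of a large system for focused exploration.
    reachable = nx.descendants(g, "run") | {"run"}
    print(sorted(g.subgraph(reachable).edges()))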

The Prism of Perception
Ryan O'Connell | Music | Masters Student

As different and previously separate categories of art merge in the present day, it can become difficult to label or even define a work of art that one experiences. Many modern artists who use sound (including music) and images simultaneously are perfect examples of this categorical vagueness. Using the young, experimental punk/hip-hop trio Death Grips as a vehicle for discussion, I argue that the best mindset for experiencing new works of art is one free from categorization and genre-based preconceptions, noting that each individual mind perceives art differently and therefore should not be primed by (now unnecessary) labeling.

New Multi-Color Transfer
Long Bao | Electrical and Computer Engineering | Ph.D. Student

In this talk, two new multi-color transfer algorithms for still images and image sequences are proposed. The developed tools can be used to capture the artistic ambience or "mood" of a source image and transfer that same ambience to a target image. The performance and effectiveness of these new algorithms are demonstrated through simulations and comparisons to state-of-the-art methods, including Alla's, Reinhard's, and Pitie's. These algorithms are straightforward, automatic, and suitable for various practical recoloring applications where incorporating coloring, color correction, animation, and restoration tools into consumer products is desirable. This work is also useful for fast implementation of special effects in the entertainment industry.
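
Of the baseline methods named, Reinhard's is simple enough to sketch: shift and scale each channel of the target image so its statistics match the source's. A minimal NumPy sketch of that statistics-matching idea (in RGB for brevity; Reinhard et al. actually work in a decorrelated l-alpha-beta space, and the new multi-color algorithms above are more elaborate):

    import numpy as np

    def transfer_mood(source, target):
        # Match each target channel's mean/std to the source's, so the
        # target inherits the source's overall color "mood".
        out = np.empty_like(target)
        for c in range(3):
            s, t = source[..., c], target[..., c]
            out[..., c] = (t - t.mean()) * (s.std() / t.std()) + s.mean()
        return np.clip(out, 0.0, 1.0)

    # Random arrays stand in for real images (H x W x 3, values in [0, 1]).
    source = np.random.rand(64, 64, 3)
    target = np.random.rand(48, 48, 3)
    recolored = transfer_mood(source, target)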

You can't think in binary: challenges in designing tools for robots
Aleksandra Kaszowska | Psychology | Ph.D. Student

Effective problem solving requires understanding the problem and the desired outcome, and then generating possible solution paths to move between the two. Developing a tool requires engineers to keep the problem in mind while also considering the end user's interaction with the tool. Engineers can draw on their own experience with tools through mental simulation: they can increase a tool's usability by analyzing the user's cognitive processing requirements for each step of a task and mentally simulating the interaction with the tool. The initial analysis and mental representation of the problem influence the direction and range of proposed solutions, as well as the efficiency with which a successful solution is found. However, mental representations become less relevant when engineers design tools for robots instead of people: engineers who draw on their own experience when designing a tool for a robot are likely to start from an erroneous representation of the design problem. We explore differences in the cognitive processes that engineers invoke when designing tools for different populations: humans, robots, or mixed user teams. The end user is sorting a set of Lego™ blocks into predefined box compartments and needs a tool to increase sorting efficiency. Participants conduct a 10-minute "brainstorming" session about a possible tool, followed by a design phase in which the idea is refined without a strictly imposed time limit. We examine how the starting point of the task analysis differs with regard to the tool's end user: what assumptions are made about the end user's capabilities and limitations? How do these assumptions evolve or get discarded as the design is refined? Simply speaking: how do we assess whether a design idea is a good idea?

Recovery of MRI Data Using Low Rank Tensor Models Under Operationally Constrained Sampling
Daniel Banco | Electrical and Computer Engineering | Masters Student

We investigate the utility of low-rank subspace models for the recovery of Magnetic Resonance Imaging (MRI) data from limited sampling of the k-t space in dynamic imaging. In particular, for 3-D MRI data (x-y-time) we employ several tensor factorization techniques and assess the degree of dimensionality reduction, referred to as the compressibility, that can be obtained. This algebraic approach is more data-adaptive than existing compressed sensing (CS) methods, which exploit sparsity in a fixed transform domain such as wavelets or total variation. Further, we compare several tensor factorization approaches for recovering temporal MRI data from limited sampling, where the sampling process is restricted to be random along one direction in k-space. Experimental results on synthetically subsampled MRI data show promise in using tensor factorization for sampling and recovery of MRI data, and we present several fruitful research directions.
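
A minimal sketch of the low-rank idea behind this abstract: unfold a 3-D (x, y, time) array into a space-by-time matrix, truncate its SVD, and measure how much signal a small rank retains, one simple proxy for compressibility. Synthetic data stands in for real k-t measurements, and a plain matrix SVD stands in for the richer tensor factorizations compared in the talk:

    import numpy as np

    nx_, ny, nt, rank = 32, 32, 40, 5
    # Synthetic low-rank-plus-noise data standing in for (x, y, t) MRI frames.
    data = (np.random.randn(nx_ * ny, rank) @ np.random.randn(rank, nt)
            + 0.1 * np.random.randn(nx_ * ny, nt))

    u, s, vt = np.linalg.svd(data, full_matrices=False)
    approx = u[:, :rank] * s[:rank] @ vt[:rank]     # rank-5 reconstruction
    rel_err = np.linalg.norm(data - approx) / np.linalg.norm(data)
    energy = (s[:rank] ** 2).sum() / (s ** 2).sum()
    print(f"rank-{rank}: {energy:.1%} of energy kept, relative error {rel_err:.3f}")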

Session B – 15 minute talks

Automated Method for Climate Regionalization based on Satellite Remote Sensing Data: An Application to the Spatial Modelling of Urogenital Schistosomiasis in Ghana
Madeline Wrable | Civil and Environmental Engineering | Masters Student

Regional climate is an important factor in modeling diseases with strong links to environmental risks. However, existing climate classification schemes were not designed for public health applications such as predictive modeling of infectious disease transmission. In this work we aim to develop a climate regionalization for Ghana using satellite remote sensing data and to conduct spatiotemporal modeling of the water-related disease schistosomiasis. Endemic to the Middle East, Asia, and Africa, schistosomiasis is acquired when people come into contact with contaminated surface water bodies. Transmission of the disease requires specific environmental conditions to sustain freshwater snails, which act as intermediate hosts. In the presented work we expanded the recently developed three-step LKN method for automated climate regionalization, originally applied to the continental United States. Using this approach, we regionalized the West African country of Ghana into eight (subject to additional verification) mutually exclusive climatic zones. To complete this task we acquired multiple satellite data streams and compiled comprehensive time series of accumulated precipitation, land surface temperature, and vegetative growth indices for 1998-2015. This rich dataset is, by its nature, noisy, redundant, and collinear. Following the data-limiting step (the L-step in the LKN method), we reduced dimensionality by applying principal component analysis. We then used the k-means unsupervised classification algorithm to produce candidate regions (K-step). Finally, we applied cluster validity indices to determine the number of candidate climate regions (N-step). In the presented study we plan to regress the resulting climate regionalization against disease data obtained from the national reporting system and to assess the relationship between Ghana's climate zones and the prevalence of schistosomiasis. The proposed methodology presents an alternative to existing uses of remote sensing in predicting schistosomiasis transmission. If successful, the results of the modeling will inform the development of a decision support framework to design treatment schemes and direct scarce resources to areas with the highest risk of infection.
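
The K- and N-steps lend themselves to a compact sketch: reduce the noisy, collinear time-series matrix with PCA, cluster with k-means over a range of candidate region counts, and pick the count favored by a cluster validity index. Silhouette is used below as one common choice; the actual indices used in the LKN method may differ, and random data stands in for the satellite series:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score

    pixels = np.random.rand(500, 120)       # 500 locations x 120 months
    reduced = PCA(n_components=10).fit_transform(pixels)

    scores = {}
    for k in range(2, 12):                  # candidate numbers of regions
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(reduced)
        scores[k] = silhouette_score(reduced, labels)

    best_k = max(scores, key=scores.get)
    print(f"best candidate number of climate regions: {best_k}")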

A new method for sampling plant volatiles in the field
Eric Scott | Biology | Ph.D. Student

Plants emit complex blends of volatile organic compounds to attract mutualists or repel herbivores. Sampling plant volatiles in the field is generally done with dynamic headspace sampling (DHS), which involves enclosing plant parts in a chamber and pumping the headspace through a sorbent trap that can then be desorbed and analyzed by gas chromatography/mass spectrometry (GC/MS). DHS is limited by its equipment needs and has the potential to introduce artifacts due to increased temperature, increased humidity, and blockage of UVB light in the chamber. Our novel method uses a sorptive polymer (PDMS) held in direct contact with a plant part by small magnets. The volatiles can then be thermally desorbed from the PDMS and analyzed by GC/MS. This method does not require enclosing the plant part in a chamber, therefore eliminating the potential artifacts introduced by increased temperature and humidity, and it requires very minimal equipment in the field. We compare the sensitivity of direct-contact sorptive polymer sampling with traditional dynamic headspace sampling, explore the effects of sampling time, and test the method's sensitivity to background contamination (e.g., from neighboring plants). We believe this method has the potential to allow easier, more robust, and more cost-effective sampling of plant volatiles in field conditions.

Earthquake nucleating frictional instabilities on geological faults
Sohom Ray | Civil and Environmental Engineering | Ph.D. Student

The theory of dynamical systems allows us to understand detailed characteristics of a phenomenon that evolves with time (or any other variable) according to rules specified by a set of equations. The problem of quasi-static development of earthquake-nucleating instabilities can be viewed as an evolving dynamical system [Viesca, 2016]. Crustal earthquakes are the result of a stick-slip frictional instability on a pre-existing fault surface, and mechanics models a geological fault as a shear crack in an elastic continuum. A (linear) elastic stress analysis under static conditions determines that the shear stress along the fault drops from the loading stress due to non-uniform slip in an elastic medium; a slowly slipping fault is resisted by an additional frictional stress (due to non-zero normal traction). The modern view of friction comes from laboratory rock-friction experiments, which show that low-velocity fault friction has a direct and a subsequent evolutionary response to changes in slip velocity, with magnitudes respectively proportional to the parameters $a$ and $b$ in the constitutive relations of such rate- and state-dependent friction [e.g., Dieterich, 1979; Ruina, 1983]. External loading and the elastic shear stress act as driving agents of slip on the fault, while the frictional stress tries to inhibit it. Under these two opposing agents, the slipping region of the fault evolves in a nonlinear manner up to a critical length, beyond which the crack becomes compliant enough to set off an instability; this is called the nucleation phase of the earthquake. How the frictional instability evolves is highly sensitive to the constitutive formulation of the strength (and the parameters therein). For rate- and state-dependent friction with $a$ and $b$ uniform on a fault, translational invariance implies that any location is a potential nucleation site, the choice determined by pre-instability conditions and external forcing. In this work we show that a heterogeneous distribution of the parameters can create preferred nucleation sites on the fault. We also present an alternative way of handling the coupled, nonlinear system of slip rate and state: the dynamical-system framework reduces the study of earthquake nucleation essentially to solving a linear system for its fixed points. Velocity instabilities correspond to the stable fixed-point attractors. Stability analysis of the fixed-point solutions shows that heterogeneity of frictional properties can rule out infinitely many sites on a fault where slip instabilities cannot nucleate. Further, the distribution of frictional properties can (uniquely) determine the sites of stable fixed-point attractors, which leads to a priori determination of preferred nucleation sites independent of initial and loading conditions.
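
For reference, a standard form of the rate- and state-dependent friction law cited above (not necessarily the exact variant used in this work) writes the frictional shear strength as

    \[
      \tau = \sigma \left[ f_0 + a \ln\frac{V}{V_0} + b \ln\frac{V_0\,\theta}{D_c} \right],
      \qquad
      \frac{d\theta}{dt} = 1 - \frac{V\theta}{D_c},
    \]

where $\sigma$ is the normal stress, $V$ the slip velocity, $\theta$ the state variable, $f_0$ the reference friction coefficient at reference velocity $V_0$, and $D_c$ a characteristic slip distance. The direct effect is governed by $a$ and the evolutionary effect by $b$, matching the parameters in the abstract; the state evolution shown is the commonly used aging law, one of several forms proposed by Dieterich and Ruina.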

Cultivating Listeners: Reimagining Environmental Communication through North American Literature
Emma Schneider | English | Ph.D. Student

When it comes to environmental issues, there is not a lack of outcry; there is a lack of listening. Literature provides a space that reimagines how and to whom we listen, and it illustrates the importance of listening beyond the dominant narratives that have caused and justified our destructive relationship with nature. Grounding my presentation in the writings of Simon Ortiz and Winona LaDuke, I will present an introduction to how contemporary North American novels and poetry unpack the layered sounds and silences of industry and institutional violence and ask readers to listen beyond the loudest noises to those that are muffled or muted. In so doing, I argue that (re)learning to listen promotes environmental healing and helps us hear information that is already present but often ignored. Throughout, this presentation draws attention to the role of the humanities in responding to issues of environmental justice by envisioning alternate ways of perceiving and interacting with our world.

Tracking the Fire Thief: The Movement Mystery of an Endangered Hawaiian Waterbird
Charles van Rees | Biology | Ph.D. Student

The Hawaiian gallinule (Gallinula galeata sandvicensis) persists on the island of Oahu despite habitat loss in excess of 75% and a high prevalence of invasive predators. Remnant populations rely on protected areas and artificial habitats like golf courses, botanical gardens, and wet agricultural areas, but these areas are managed in a diversity of ways and are separated by an inhospitable urban landscape. The small populations inhabiting these habitat patches are at elevated risk of extinction if not linked by immigration and subsequent gene flow, so understanding the connectivity of remaining Hawaiian gallinule populations is of critical importance to the species' persistence. I describe the current findings of an extensive, ongoing mark-resight study of `alae `ula on Oahu, and preliminary results of a population genetics analysis in which 83 individuals from 9 different wetlands were genotyped at 13 microsatellite loci and along 495 bp of the ND2 region of mitochondrial DNA. Remarkably few inter-wetland movements have been documented in Oahu's population, and these observations are corroborated by evidence of moderate genetic structuring among Oahu's gallinules, even at very small scales (<5 km). The scale at which this structuring occurs is small for avian taxa, even for sedentary tropical species, and may be the result of the rapid urbanization of Oahu's landscape. Finally, I discuss plans to enhance data collection for this study and possible management options for enhancing connectivity.

Session C – 15 minute talks

Atomic force microscopy to determine mechanical properties and the pericellular layer in cortical neurons and guinea pig fibroblast cells
Vivekanand Kalaparthi | Mechanical Engineering | Ph.D. Student

Mechanical properties play a key role in defining cellular functionality, motility, tissue formation, and stem cell differentiation, and changes in cell stiffness can help identify abnormalities in disease phenotypes, aging, and cell differentiation. Stiffness can be quantified by using atomic force microscopy (AFM) as an indentation tool and subsequently processing the data with mathematical models. Although homogeneity of the cell material may be a reasonable approximation, the pericellular coat (a layer of polysaccharides and proteins) and the corrugation of the cell membrane (microvilli and microridges) are typically not taken into account. This not only produces discrepancies in the quantified elastic modulus but also loses vital information about the entropic nature of the pericellular layer (or brush layer), such as brush length and grafting density. In this work we address the nature of the pericellular brush in two distinct cases, cortical neurons and guinea pig fibroblast cells, characterizing the features of the brush and reinforcing the validity of the previously published brush model.
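
For context, the contact-mechanics model most commonly used to extract an elastic modulus from AFM indentation data is the Hertz model; for a spherical probe of radius $R$ indenting an elastic half-space to depth $\delta$, the load is

    \[
      F = \frac{4}{3}\,\frac{E}{1-\nu^2}\,\sqrt{R}\,\delta^{3/2},
    \]

where $E$ is the Young's modulus and $\nu$ the Poisson ratio of the sample. The brush model referred to in the abstract augments contact mechanics of this kind with an additional force contribution from the pericellular layer, which is what allows quantities such as brush length and grafting density to be estimated rather than folded into an apparent modulus.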

What are you looking AT? What we can learn from studying DNA in yeast.
Simran Kaushal | Biology | Ph.D. Student

I am studying common fragile sites, which are regions of DNA that physically break in all human beings when their cells are under stress. Common fragile sites are very vulnerable, and they become a particular issue in cancerous cells, which undergo many DNA breaks and break preferentially at common fragile sites. These breaks can cause the whole cell's genome to reorganize, which can disrupt many normal cellular functions and may also help cancer become worse over time. My goal is to understand why common fragile sites break at the molecular level, which can provide insight into how cancer cells rearrange their genomes and why cancer worsens over time. I am using baker's yeast to study common fragile sites, as it is a simple eukaryotic organism with many of the same genetic components as humans. Baker's yeast has been used as a model organism in labs for hundreds of years, so it also has many genetic assays that I can exploit for my study of common fragile sites. I am using yeast to look at small regions of common fragile site DNA and to carefully observe their genetic components and mechanisms of fragility. By looking at small DNA regions in yeast, I hope to understand the overarching conditions that must be present for common fragile sites to exhibit their characteristic breakage.

Anti-vascular endothelial growth factor treatment normalizes tuberculosis granuloma vasculature and improves small molecule delivery
Meenal Datta | Chemical and Biological Engineering | Ph.D. Student

Tuberculosis (TB) causes almost 2 million deaths annually, and an increasing number of patients are resistant to existing therapies. TB patients require lengthy chemotherapy, possibly because of poor penetration of antibiotics into the granulomas where the bacilli reside. Granulomas are morphologically similar to solid cancerous tumors in that they contain hypoxic microenvironments and can be highly fibrotic. Here we show that TB-infected rabbits have impaired small molecule distribution into these disease sites due to a functionally abnormal vasculature, with a low-molecular-weight tracer accumulating only in peripheral regions of granulomatous lesions. Granuloma-associated vessels are morphologically and spatially heterogeneous, with poor vessel pericyte coverage in both human and experimental rabbit TB granulomas. Moreover, we found enhanced vascular endothelial growth factor (VEGF) expression in both species. In tumors, anti-angiogenic (specifically anti-VEGF) treatments can "normalize" the vasculature, reducing hypoxia and creating a window of opportunity for concurrent chemotherapy; we therefore investigated vessel normalization in rabbit TB granulomas. Treatment of TB-infected rabbits with the anti-VEGF antibody bevacizumab significantly decreased the total number of vessels while normalizing those that remained. As a result, the hypoxic fractions of these granulomas were reduced and small molecule tracer delivery increased. These findings demonstrate that bevacizumab treatment promotes vascular normalization, improves small molecule delivery, and decreases hypoxia in TB granulomas, thereby providing a potential new avenue to improve the delivery and efficacy of current treatment regimens.

Probing the Secrets of Interfacial Structure
Jing Wang | Chemistry | Ph.D. Student

Digging into the molecular-level secrets of interfaces requires a spectroscopic approach, and sum frequency generation (SFG) spectroscopy plays an important role in characterizing interfacial structures. The difficulty is that SFG is a nonlinear process whose response is complex in the mathematical sense: to resolve the response into its molecular components, one must measure both the amplitude and the phase of the nonlinear response. We invented an SFG nonlinear interferometer to address this issue. Results using octadecyltrichlorosilane (OTS) on fused silica as a test system will be described.

Session D – 15 minute talks

Hip Hoppin' Around: Using Social Network Analysis to Identify Trends in the Hip-Hop Network of the 1990s
Kerrlene Wills | International Relations | Masters Student

Using Social Network Analysis (SNA), this project highlights trends in the hip-hop network of the 1990s. This decade, often referred to as "The Golden Era of Hip Hop", was one of the most transformative eras for rap music. It saw the rise of 'gangsta' rap, which originated on the West Coast, reflected one of the most dangerous times for the genre when rap rivalries turned violent, and creatively depicted the struggle of African Americans in urban cities for the first time. In the 1990s, hip-hop was more than just a genre of music; it was a way of life. This SNA project examines the hip-hop way of life by exploring the following questions (a centrality sketch follows the list):

1. What was the West Coast v. East Coast rivalry really about? This infamous hip-hop battle is often thought of as a rivalry between regions. However, through SNA, this project shows that it was smaller than rap gurus portray it to be: really a rivalry between Tupac (West Coast) and a few rap artists on the East Coast, specifically from Bad Boy Records.
2. Could Method Man have saved Notorious B.I.G. and Tupac? SNA reveals that Method Man was one of the most well-connected rappers of the 90s. He was a member of the Wu-Tang Clan, worked with both Tupac and Biggie Smalls, and had connections with other rappers in the game. This project suggests that Method Man could have been the peacekeeper and possibly prevented the untimely end of hip-hop's most beloved rappers.
3. Did hip-hop rivalries translate into album sales? This was a difficult question to answer; however, using centrality measures and attribute data, this project shows that the most profitable rap albums were the ones embedded in the 'Coast Rivalry'.
4. How did your network influence your Grammy win? In the hip-hop world, the Grammys have been the most controversial yet coveted award. While Biggie and Tupac had some of the highest-ranking and most profitable albums, they never won a Grammy for their successes. This project uses SNA to examine how an artist's network influenced their Grammy wins and nominations.
5. Finally, the most difficult question of all: if B.I.G. were still alive, which rapper of the decade would not have been so popular? This part of the project looks at centrality measures to identify a rap artist that B.I.G. would have overshadowed if he were still alive. My prediction is Jay-Z.
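
A minimal sketch of the kind of centrality computation behind these claims, using networkx on a tiny invented collaboration graph (not the project's actual dataset):

    import networkx as nx

    # Each edge is a collaboration; the edge list here is illustrative only.
    collabs = [
        ("Method Man", "Tupac"), ("Method Man", "Biggie"),
        ("Method Man", "RZA"), ("RZA", "GZA"),
        ("Biggie", "Puff Daddy"), ("Tupac", "Dr. Dre"),
    ]
    g = nx.Graph(collabs)

    # Betweenness centrality: how often an artist sits on shortest paths
    # between others, one measure of being a potential "peacekeeper".
    ranking = nx.betweenness_centrality(g)
    for artist, score in sorted(ranking.items(), key=lambda kv: -kv[1]):
        print(f"{artist:12s} {score:.2f}")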

A neural field model of repetition effects in early time-course ERPs in spoken word perception
Andy Valenti | Computer Science | Ph.D. Student

Previous attempts at modeling the neuro-cognitive mechanisms underlying word processing have used connectionist architectures (Laszlo & Plaut, 2012; Rabovsky & McRae, 2014; Sadeghi, Scheutz, Pu, Holcomb, & Midgley, 2013), and none has modeled spoken word processing. A well-studied means of analyzing language processing is the event-related potential (ERP), a time-synchronized component of an electroencephalograph (EEG) pattern (Luck, 2014). In this paper, we present a model of early time-course ERP word repetition effects based on neural fields. This represents a new modeling approach to studying these neurocognitive processes, one based on the bottom-up interaction of sensory information with higher-level categories of cognitive processing.
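
For orientation, neural field models of this kind describe the evolution of a continuous activity variable $u(x,t)$ over a feature or cortical dimension $x$; a standard single-layer (Amari-type) formulation, which may differ in detail from the model in this paper, is

    \[
      \tau\,\frac{\partial u(x,t)}{\partial t} = -u(x,t)
        + \int w(x - x')\, f\!\left(u(x',t)\right)\, dx' + s(x,t),
    \]

where $w$ is the lateral interaction kernel, $f$ a firing-rate nonlinearity, $s(x,t)$ the external (here, spoken-word) input, and $\tau$ a time constant. Repetition effects can then be probed by presenting the same input twice and comparing the evoked field responses.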

Tap-to-household water quality deterioration: analysis of spatial and temporal uncertainties
Tania Alarcon Falconi | Civil and Environmental Engineering | Ph.D. Student

A fundamental aspect of any statistical analysis is the understanding and characterization of uncertainties and their effects on statistical inferences. Observational studies attempt to answer questions when a fully controlled experimental design is impractical or impossible. As a result, even well-thought-out observational studies can carry a high level of uncertainty, because the collected data are incomplete, imprecise, or error-prone. Methods used to address these uncertainties rely on spatial and temporal assumptions regarding the dose, time, and location of exposure, yet in practice the effects of these assumptions on statistical results are rarely evaluated. In this study, we explored the effects of spatiotemporal uncertainties on the association between microbiological tap-to-household water quality deterioration (total and fecal coliform concentrations) and enteric infections in 160 households of urban slums in Vellore, India. Our fully geocoded dataset included weekly records of diarrheal infections; monthly measurements of water quality at public taps of a piped distribution system and in household storage containers; and household-level demographic and behavioral factors affecting water, sanitation, and hygiene practices. Households had to store water because the public water supply was intermittent and infrequent; however, the source tap from which each household collected its water, and the time of collection, were unknown. To determine a possible source tap for each household sample, we created GIS-based tap-to-household links for scenarios built on a same-day temporal assumption and on spatial assumptions (linear versus network distances, and closest distance versus an inverse-distance weighted average). Regardless of assumptions, approximately 50% of all households had higher fecal coliform concentrations than the assumed source. The association between tap-to-household fecal coliform deterioration and the occurrence of diarrheal infections was highly dependent on the spatiotemporal assumptions. The geospatial and statistical methods used in this study provide a foundation for further developing a systematic approach to characterizing data uncertainties in epidemiological research.
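
The closest-distance versus inverse-distance-weighted spatial assumptions are easy to make concrete. A minimal Python sketch with invented coordinates and concentrations:

    import numpy as np

    taps = np.array([[0.0, 0.0], [1.0, 0.5], [0.2, 1.2]])   # tap locations
    conc = np.array([120.0, 40.0, 75.0])  # coliform per 100 mL (made up)
    house = np.array([0.3, 0.4])          # household location

    d = np.linalg.norm(taps - house, axis=1)
    closest = conc[d.argmin()]            # closest-distance assumption
    w = 1.0 / d**2                        # inverse-distance weights
    idw = (w * conc).sum() / w.sum()      # IDW-average assumption
    print(f"closest tap: {closest:.0f}, IDW average: {idw:.0f}")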

Roles of Social Stress in Cocaine Use Disorders
Xiao Han | Psychology | Ph.D. Student

Social stress can escalate cocaine intake and cause cocaine relapse in humans; however, the neural mechanisms by which social stress drives drug addiction are still unclear. The current study investigates how social stress influences cocaine self-administration and cocaine relapse in mice. We employed a resident-intruder social defeat stress mouse model, mimicking social stress in humans, and first showed that social stress can escalate cocaine self-administration and cocaine seeking in mice. Next, we used a pharmacological method to microinject a corticotrophin releasing factor receptor 1 (CRF-R1) antagonist into the ventral tegmental area (VTA), a brain area implicated in drug addiction, and found that the VTA CRF-R1 antagonist significantly reversed the social stress effects. After confirming the role of VTA CRF neurons, we employed a newer technique, designer receptors exclusively activated by designer drugs (DREADDs), which allowed us to target CRF neurons in a specific brain region. By activating the CRF neurons in the VTA, we found that DREADDs mice self-administered significantly more cocaine than controls. The results demonstrate that CRF receptors in the VTA are necessary for social stress to engender escalated cocaine self-administration, and that DREADDs activation of CRF in the VTA increases cocaine self-administration, mimicking the effects of social defeat stress. These findings provide convincing support for the involvement of social stress in drug abuse and suggest novel therapeutic approaches for stress-induced drug use disorders.

Session E – 5 minute talks

Ultrabright fluorescent silica nanoparticles as nanothermometers
Vivekanand Kalaparthi | Mechanical Engineering | Ph.D. Student

Self-assembled mesoporous silica particles are an important class of materials because of their numerous advantages. Here we are interested in a sub-class of this material type, MCM-41, which has ordered cylindrical channels with pore sizes of 3-3.5 nm. Mesoporous silica particles can be synthesized in different sizes, ranging from a few microns down to tens of nanometers. These particles exhibit optical transparency, high thermal conductivity, and robust mechanical properties, and they can tolerate a wide range of pH. Mesoporous silica has been a material of choice due to its good biocompatibility, low toxicity, and ease of functionalization with sensing molecules. Using various reactive moieties such as carboxylic acids, biotin, streptavidin, amines, and thiols, one can attach specific sensing molecules to a fluorescent labeling particle: antibodies, various proteins, peptides, nucleic acids, aptamers, small molecules, and even liposomes. A direct application of these particles as fluorescent labels is realized when organic fluorophores are physically entrapped inside the nanochannels during synthesis itself. Because the dye is well protected inside the channels, the fluorescent brightness improves by orders of magnitude in comparison to other existing fluorescent nanostructures; this increase in brightness is termed ultrabrightness. A potential application of these particles is in labeling biological cells and in prescreening tests, such as morbidity classification of cells in cancer. Building on this ultrabright platform and the other advantages mentioned above, here we present the development of fluorescence ratiometric nanothermometry. Nanoparticles are synthesized with two different dyes (one responding to temperature change, the other not) that form a Förster resonance energy transfer (FRET) pair. This provides the flexibility of using a light source of a single wavelength to excite the donor dye in the FRET pair, while the energy is transferred to the acceptor. Applications of such systems include measurements at high spatial resolution, down to a few tens of nanometers, for probing subcellular regions inside cells, in microfluidics, and in theranostics, where conventional thermometry fails severely due to size constraints and accuracy requirements. These colloidal nanoparticles can exhibit a spatial resolution of ~40 nm with a temperature accuracy of 0.5 degrees Celsius.

The Transcriptomic Basis of Metamorphic Competence
Robert Burns | Biology | Ph.D. Student

Many marine invertebrates depend on their larvae as the dispersal mechanism for their species. In many of these species, the larvae are not capable of metamorphosing for hours to weeks after being released into the plankton. The mechanisms that govern the transition from pre-competent larvae, which are not able to metamorphose, to metamorphically competent larvae are unknown. I studied the pre-competent and competent larvae of the salt-marsh-dwelling polychaete worm Capitella teleta, a species in which pre-competent larvae are unusually easy to distinguish from competent larvae. I sequenced the transcriptomes of the pre-competent and competent larvae of C. teleta to determine differences in gene expression between these two functionally different larval stages. By performing differential expression analyses, I found that 1541 genes were up-regulated in pre-competent larvae while 1063 genes were up-regulated in competent larvae. Pre-competent larvae had up-regulated genes belonging to gene ontologies associated with growth and development, while competent larvae had up-regulated ligand-binding transmembrane channels with possible chemo- and mechanosensory functions. The majority of these channels were annotated as belonging to the Degenerin/Epithelial sodium channel family or the G-protein coupled receptor family; proteins from these families can have chemosensory functions. Serotonin and GABA receptors were among the genes up-regulated in competent larvae; both have been shown to induce larvae of C. teleta and other marine invertebrates to metamorphose, and both are thought to be components of the signal transduction pathway that initiates metamorphosis. Overall, it appears that once larvae of C. teleta have completed development of the internal structures required for juvenile life, competent larvae up-regulate the expression of possible chemosensory proteins and neurotransmitter receptors, making the detection and transduction of signals from external settlement cues possible.

A Level Set Approach to Simulating Xenopus laevis Tail Regeneration
Zachary Serlin | Mechanical Engineering | Masters Student

A framework for predictively linking cell-level signaling with larger-scale patterning in regeneration and growth has yet to be created within the field of regenerative biology. If this could be achieved, regeneration (controlled cell growth), cancer (uncontrolled cell growth), and birth defects (mispatterning of cell growth) could be more easily understood and manipulated. My research aims, from a mechanical engineering perspective, to create a key part of this preliminary framework by using level set methods to describe cell growth and a cellular control scheme to determine the behavior of individual cells. The simulation specifically addresses Xenopus laevis tail regeneration and shows promise in creating an abstracted model to predict cell patterning at the macroscopic level. The growth model can be tuned to match experimental regeneration results and is a stepping-stone for future work on a more detailed large-scale patterning model.
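
For reference, the core equation of level set methods represents the evolving boundary implicitly as the zero contour of a function $\phi(x,t)$ and moves it with a normal speed $F$:

    \[
      \frac{\partial \phi}{\partial t} + F\,\lvert \nabla \phi \rvert = 0,
    \]

with the tissue outline given by $\{x : \phi(x,t) = 0\}$. In a growth simulation of this kind, $F$ would encode the locally controlled growth rate supplied by the cellular control scheme; that coupling is the modeling choice, not part of the generic method.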

Microbes in a Pickle: what fermented vegetables can tell us about microbiomes
Esther Miller | Biology | Ph.D. Student

Microbes abound in the environment, but what factors govern how microbial communities are assembled? Using sterile cabbages and in-vitro sauerkrauts, I aim to unravel some of the key ecological questions facing microbial scientists, food producers, and you, a consumer!

Skill versus strategy in conditional probability judgments
Aleksandra Kaszowska | Psychology | Ph.D. Student

We make judgments based on conditional probability on a daily basis, even though we are surprisingly bad at interpreting complex statistical information. Performance differences across users are rooted in the representations of available data (textual or visual) and in users' ability to extract relevant information from those representations. Spatial ability, the cognitive ability to mentally represent and manipulate two- and three-dimensional object representations, is a crucial factor influencing performance on statistical reasoning tasks. Ottley and colleagues (2015) assessed participants' accuracy in statistical reasoning tasks while varying how the data were presented: as a textual scenario, as a visualization, or as a combination of the two. People with high spatial ability consistently outperformed low-spatial-ability users on accuracy regardless of data format. But while low-spatial-ability users produced low accuracy scores across all data presentations, high-spatial-ability users benefitted more from single-mode representations and showed a significant decline in accuracy when viewing textual and visualization information at the same time. It is possible that when presented with a dual-mode data presentation, high-spatial-ability users try to integrate the text and visualization and fail, while low-spatial-ability users choose to attend to only one mode of presentation. This theory poses a bigger question: are there data-viewing strategies that influence judgment accuracy? We employ eye tracking to investigate whether visual scanning patterns play a role in interpreting statistical information, and we explore potential links between visual attention deployment and spatial ability.
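
A worked example of the kind of conditional-probability judgment at issue (the classic disease-screening problem, not an item from the Ottley et al. study):

    # Given a disease with 1% prevalence, a test with a 90% true positive
    # rate and a 9% false positive rate: what is P(disease | positive)?
    # People routinely guess ~90%; Bayes' rule gives roughly 9%.
    p_d = 0.01          # prior: P(disease)
    p_pos_d = 0.90      # sensitivity: P(positive | disease)
    p_pos_nd = 0.09     # false positive rate: P(positive | no disease)

    p_pos = p_pos_d * p_d + p_pos_nd * (1 - p_d)    # total probability
    posterior = p_pos_d * p_d / p_pos               # Bayes' rule
    print(f"P(disease | positive) = {posterior:.2f}")   # ~0.09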

A Facial Recognition System for Matching Computerized Composite Sketches to Facial Photos Using Human Visual System Algorithms
Qianwen Wan | Electrical Engineering | Ph.D. Student

Automated facial recognition has come to be considered one of the most important tools for law enforcement because of its dramatic progress over the past decades. An automatic facial recognition system can quickly help determine the identity of criminals and narrow down the pool of potential suspects, though in many cases a facial photo of the suspect is not available. A sketch drawn by an artist from a description provided by an eyewitness or the victim is then a commonly used way to help police identify possible suspects; indeed, forensic sketches have been used as far back as the 19th century. However, employing competent and proficient forensic sketch artists costs a great deal of money and effort, so for budgetary reasons many law enforcement agencies use facial composite software to create computerized composite sketches, and such sketches are now widely used in criminal investigations. Yet there is only a limited amount of research on matching computerized sketches to face photos: most published facial recognition work focuses on photo-to-photo or hand-drawn-sketch-to-photo recognition. There is therefore a need to study computerized-composite-sketch-to-face-photo recognition. The contributions of this paper are: (i) we develop an automatic system for matching computerized sketches to facial photos, with an architecture consisting of a training part and a testing part, each comprising four procedures: Human Visual System based image decomposition, Logarithm Local Binary Pattern feature extraction, classification, and facial component region weighting; (ii) we created a database of people of interest with both their forensic and computerized sketches, which can be used by future researchers; to build this database, we added computerized composite sketches to part of the CUHK student database, the AR database, and the database of the Tufts University Panetta Imaging and Simulation Lab; (iii) we compared the efficiency and accuracy of matching forensic sketches against matching composite sketches, which is valuable since an available public dataset is hard to find. In conclusion, our automatic computerized-sketch-to-facial-photo recognition system solves the composite sketch to facial photo matching problem in a low-cost, accurate, and efficient manner. We believe it can aid in identifying suspects in criminal investigations and in finding lost children.
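
A minimal sketch of plain Local Binary Patterns, the texture code underlying the Logarithm LBP features named above (the logarithm variant itself is not reproduced here): each pixel is encoded by thresholding its 8 neighbors against it, giving an 8-bit code whose histogram serves as a feature.

    import numpy as np

    def lbp_codes(img):
        # img: 2-D uint8 grayscale array; returns codes for interior pixels.
        c = img[1:-1, 1:-1]
        # 8 neighbors in a fixed clockwise order, each contributing one bit.
        shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                  (1, 1), (1, 0), (1, -1), (0, -1)]
        codes = np.zeros_like(c, dtype=np.uint8)
        for bit, (dy, dx) in enumerate(shifts):
            nb = img[1 + dy:img.shape[0] - 1 + dy,
                     1 + dx:img.shape[1] - 1 + dx]
            codes |= ((nb >= c).astype(np.uint8) << bit)
        return codes

    img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)  # toy "image"
    print(lbp_codes(img))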

"His Natural Colour Began to Rub Off"
Matt DiCintio | Drama & Dance | Ph.D. Student

In 1796, Harry Moss created a sensation in Philadelphia when he put himself on display at the Black Horse Tavern. Moss was a free-born African American whose skin, at age 38, had begun to turn white. Modern physicians would diagnose Moss with the pigment-deficient condition vitiligo, but in the first decade of the United States, Moss's appearance challenged the racialized identities that were part of eighteenth-century thought and that were written into the Constitution. In submitting himself to the gaze of white Philadelphians, and in articulating to them the story of his condition, Moss resisted contemporary portrayals of blacks as indolent, unthinking, and degenerate. Moss's skin was literally turning white, just as he was figuratively "acting white." If a black man could turn white so easily, how easily could whites turn black? How easily could whites then be enslaved? What were the true justifications for maintaining slavery? My presentation considers how Moss's audience attempted to answer these questions, all the while making Moss rich and famous.

Multivariate regression to link macroeconomics, hydrology and water-related capacities
Agustin Botteron | Civil and Environmental Engineering | Masters Student

A recent CIA report asserted that water defines this planet as no other resource does, and that no country is immune to the challenges we face regarding water. Water security is increasingly central to global conversations such as food security and poverty reduction. Water security involves managing scarcity of, abundance of, and uncertainty in water resources so that they do not place an intolerable burden on society and its supporting economy. It is claimed that regions with "easy" hydrology (constant annual rainfall and river streamflow) are blessed and bound to be rich, while regions with "complicated" hydrology (seasonal and/or inter-annual variability in rainfall and streamflow) are doomed to be poor unless investments in infrastructure and institutions are deployed. This paper contributes to the ongoing discussion by building a multivariate regression model linking hydrology, human interventions, and economic growth for 159 global basins with populations larger than 2 million people. We found that per capita Gross Basin Product (GBPpc) is impacted negatively by seasonal runoff variability and positively by increases in public services and water-management capacities within the basin. Our model explains more than 84% of the variability in the response variable. We expect our results to be helpful to governments and multilateral institutions in decision-making for water security and sustainable development.

Towards MacGyver Robots
Vasanth Sarathy | Computer Science | Ph.D. Student

MacGyver was a television series in the 1980s featuring a resourceful secret agent who solved complex problems, often in life-or-death situations, by using common objects around him in inventive ways. As it happens, many natural human activities involve this form of reasoning, albeit not always as dramatic or as clever. Using a mug as a paperweight, or folding a piece of paper and propping it under a door to keep it open, are examples of reasoning tasks that involve generating novel uses for objects. This type of reasoning is influenced not only by an object's physical and functional features, but also by changing contexts, social norms, historical precedent, and a degree of uncertainty. Learning how and when to use objects is a highly desirable skill for robots as well, as they assist our elderly or navigate unforgiving terrain. In this research, we are developing a computational framework for reasoning about "affordances," the term used to capture the interrelationship between humans or robots and their environment. We seek to answer fascinating questions about perception and creativity that lie at the intersection of artificial intelligence, cognitive science, and philosophy.


Similar resources

Relevance of a Toll-Free Call Service Using an Interactive Voice Server to Strengthen Health System Governance and Responsiveness in Burkina Faso

Background: In Africa, health systems are poorly accessible, inequitable, and unresponsive. People rarely have either the confidence or the opportunity to express their opinions. In Burkina Faso, there is a political will to improve governance and responsiveness to create a more relevant and equitable health system. Given their development in Africa, information and communication technolog...


NetAffx Gene Ontology Mining Tool: a visual approach for microarray data analysis.

SUMMARY The NetAffx Gene Ontology (GO) Mining Tool is a web-based, interactive tool that permits traversal of the GO graph in the context of microarray data. It accepts a list of Affymetrix probe sets and renders a GO graph as a heat map colored according to significance measurements. The rendered graph is interactive, with nodes linked to public web sites and to lists of the relevant probe set...


Affordances and limitations of technology: Voices from EFL teachers and learners

With new technologies appearing very quickly, attention has been focused more on technology than on learning. English centers and institutes have mostly been busy accommodating new programs and technologies and hence have not spent enough time evaluating the CALL programs and technologies employed to find their affordances and limitations. The present study was an attempt ...


JexVis: An interactive visualization tool for exception call graphs in Java

Just as the structure of any system degrades over time as it evolves and becomes more complicated, software systems are no exception. A particular aspect of software systems is the exception handling mechanism, which also tends to get dirty with time. Jex, a tool developed by Martin Robillard at the University of British Columbia, is a static analyzer that extracts exception flow information fr...


CARRIE web service: automated transcriptional regulatory network inference and interactive analysis

We present an intuitive and interactive web service for CARRIE (Computational Ascertainment of Regulatory Relationships Inferred from Expression). CARRIE is a computational method that analyzes microarray and promoter sequence data to infer a transcriptional regulatory network from the response to a specific stimulus. This service displays an interactive graph of the inferred network and provid...


Interactive color mosaic and dendrogram displays for signal/noise optimization in microarray data analysis

Data analysis and visualization are strongly influenced by noise and noise filters. There are multiple sources of "noise" in microarray data analysis, but signal/noise ratios are rarely optimized, or even considered. Here, we report a noise analysis of a novel 13-million-oligonucleotide dataset: 25 human U133A (~500,000 features) profiles of patient muscle biopsies. We use our recently described ...




Publication date: 2016